Maybe a single, superintelligent civilization has indeed taken over the galaxy and has imposed strict radio silence because it is paranoid about potential competitors.
Eventually, it may modify and upgrade itself, evolving into a superintelligent system that leaves humankind in the dust, perhaps more than just metaphorically.
Some predict that exponentially improving technologies will take us to a singularity, beyond which superintelligent artificial intelligence will transform our world in nearly unimaginable ways.
A superintelligent AI is by definition very good at attaining its goals, so the most important thing for us to do is to ensure that its goals are aligned with ours.
And perhaps most interestingly, the extropians' commitment to taking superintelligent AI seriously led them to talk a lot about the risks of the technology and how they could be mitigated.
Even if created solely with humanity's best interests in mind, superintelligent AI could pose an existential risk if it isn't perfectly aligned with human values, a task scientists are finding extremely difficult.